1 - 20 of 124
1.
Acad Med ; 2024 Apr 10.
Article En | MEDLINE | ID: mdl-38602892

ABSTRACT: Over the past decade, entrustable professional activities (EPAs) have become an important element in the competency-based medical education movement. In this Commentary, the authors explore informed consent as an EPA within resident surgical training. In doing so, they foreground the concept of culture and reexamine the nature of trust and entrustment decisions from within a cultural framework. The authors identify role modeling and professional identity formation as core elements in the training process and suggest that faculty are sometimes better off using these tools than uncritically adopting a formal EPA framework for what is, in essence, a professionally oriented and values-based moral enterprise. They conclude that EPAs work best when they are developed at a local level, stressing the unique specialty and program culture as well as the care that must be taken when attempting to transfer notions of entrustment from the undergraduate medical education level to graduate medical education settings.

2.
JAMA Surg ; 159(5): 546-552, 2024 May 01.
Article En | MEDLINE | ID: mdl-38477914

Importance: National data on the development of competence during training have been reported using the Accreditation Council for Graduate Medical Education (ACGME) Milestones system. It is now possible to consider longitudinal analyses that link Milestone ratings during training to patient outcomes data of recent graduates. Objective: To evaluate the association of in-training ACGME Milestone ratings in a surgical specialty with subsequent complication rates following a commonly performed operation, endovascular aortic aneurysm repair (EVAR). Design, Setting, and Participants: This study examined patient outcomes following EVAR in the Vascular Quality Initiative (VQI) registry (4213 admissions from 208 hospitals treated by 327 surgeons). All surgeons included in this study graduated from ACGME-accredited training programs from 2015 through 2019 and had Milestone ratings 6 months prior to graduation. Data were analyzed from December 1, 2021, through September 15, 2023. Because Milestone ratings can vary with program, they were corrected for program effect using a deviation score from the program mean. Exposure: Milestone ratings assigned to individual trainees 6 months prior to graduation, based on judgments of surgical competence. Main Outcomes and Measures: Surgical complications following EVAR for patients treated by recent graduates during the index hospitalization, obtained using the nationwide Society for Vascular Surgery Patient Safety Organization's VQI registry, which includes 929 participating centers in 49 US states. Results: The study included outcomes for 4213 patients (mean [SD] age, 73.25 [8.74] years; 3379 male participants [80.2%]). Postoperative complications included 9.5% major (400 of 4213 cases) and 30.2% minor (1274 of 4213 cases) complications. After adjusting for patient risk factors and site of training, a significant association was identified between individual Milestone ratings of surgical trainees and major complications in early surgical practice in programs with lower mean Milestone ratings (odds ratio, 0.50; 95% CI, 0.27-0.95). Conclusions and Relevance: In this study, Milestone assessments of surgical trainees were associated with subsequent clinical outcomes in their early career. Although these findings represent one surgical specialty, they suggest that, when the same theory and methodology are applied, Milestone ratings could be used in any specialty to identify trainees at risk for future adverse patient outcomes. Milestones data should inform data-driven educational interventions and trainee remediation to optimize future patient outcomes.
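A minimal sketch of the analytic idea in this abstract, assuming a flat case-level file: each graduate's final-year Milestone rating is centered on their program's mean to form a deviation score, which then enters a risk-adjusted logistic model of major complications. The file name, column names, and covariates are hypothetical placeholders, not the authors' data or code.

```python
# Sketch only: program-mean-centered Milestone deviation entered into a
# risk-adjusted logistic model of major complications after EVAR.
# File and column names (including covariates) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cases = pd.read_csv("evar_cases.csv")  # hypothetical file, one row per index admission

# Deviation score: surgeon's final-year Milestone rating minus the mean
# rating assigned within that surgeon's training program.
cases["milestone_dev"] = (
    cases["milestone"] - cases.groupby("program_id")["milestone"].transform("mean")
)

# Logistic model adjusting for (hypothetical) patient risk factors.
fit = smf.logit(
    "major_complication ~ milestone_dev + age + female + ruptured + aneurysm_diameter",
    data=cases,
).fit()

# Odds ratio per unit of Milestone deviation, with its 95% CI.
print(np.exp(fit.params["milestone_dev"]))
print(np.exp(fit.conf_int().loc["milestone_dev"]))
```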


Accreditation , Clinical Competence , Education, Medical, Graduate , Endovascular Procedures , Postoperative Complications , Humans , Male , Female , Postoperative Complications/epidemiology , Endovascular Procedures/education , United States , Registries , Internship and Residency , Surgeons/education , Surgeons/standards , Aged , Middle Aged
3.
Perspect Med Educ ; 13(1): 95-107, 2024.
Article En | MEDLINE | ID: mdl-38343556

Program evaluation is an essential, but often neglected, activity in any transformational educational change. Competence by Design (CBD) was a large-scale change initiative to implement a competency-based time-variable educational system in Canadian postgraduate medical education. A program evaluation strategy was an integral part of the build and implementation plan for CBD from the beginning, providing insights into implementation progress, challenges, unexpected outcomes, and impact. The Competence by Design program evaluation strategy was built upon a logic model and three pillars of evaluation: readiness to implement, fidelity and integrity of implementation, and outcomes of implementation. The program evaluation strategy harvested from both internally driven studies and those performed by partners and invested others. A dashboard for the program evaluation strategy was created to transparently display a real-time view of Competence by Design implementation and facilitate continuous adaptation and improvement. The findings of the program evaluation for Competence by Design drove changes to all aspects of the Competence by Design implementation, aided engagement of partners, supported change management, and deepened our understanding of the journey required for transformational educational change in a complex national postgraduate medical education system. The program evaluation strategy for Competence by Design provides a framework for program evaluation for any large-scale change in health professions education.


Competency-Based Education , Education, Medical , Humans , Canada , Program Evaluation , Curriculum
4.
Ann Surg ; 279(1): 180-186, 2024 01 01.
Article En | MEDLINE | ID: mdl-37436889

OBJECTIVE: To determine the relationship between, and predictive utility of, milestone ratings and subsequent American Board of Surgery (ABS) vascular surgery in-training examination (VSITE), vascular qualifying examination (VQE), and vascular certifying examination (VCE) performance in a national cohort of vascular surgery trainees. BACKGROUND: Specialty board certification is an important indicator of physician competence. However, predicting future board certification examination performance during training continues to be challenging. METHODS: This is a national longitudinal cohort study examining relational and predictive associations between Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings and performance on VSITE, VQE, and VCE for all vascular surgery trainees from 2015 to 2021. Predictive associations between milestone ratings and VSITE were examined using cross-classified random-effects regression. Cross-classified random-effects logistic regression was used to identify predictive associations between milestone ratings and VQE and VCE. RESULTS: Milestone ratings were obtained for all residents and fellows (n=1,118) from 164 programs during the study period (from July 2015 to June 2021), including 145,959 total trainee assessments. Medical knowledge (MK) and patient care (PC) milestone ratings were strongly predictive of VSITE performance across all postgraduate years (PGYs) of training, with MK ratings demonstrating a slightly stronger predictive association overall (MK coefficient 17.26 to 35.76, β = 0.15 to 0.23). All core competency ratings were predictive of VSITE performance in PGYs 4 and 5. PGY 5 MK was highly predictive of VQE performance [OR 4.73, (95% CI, 3.87-5.78), P < 0.001]. PC subcompetencies were also highly predictive of VQE performance in the final year of training [OR 4.14, (95% CI, 3.17-5.41), P < 0.001]. All other competencies were also significantly predictive of first-attempt VQE pass with ORs of 1.53 and higher. PGY 4 interpersonal and communication skills (ICS) ratings [OR 4.0, (95% CI, 3.06-5.21), P < 0.001] emerged as the strongest predictor of VCE first-attempt pass. Again, all subcompetency ratings remained significant predictors of first-attempt pass on VCE with ORs of 1.48 and higher. CONCLUSIONS: ACGME Milestone ratings are highly predictive of future VSITE performance and first-attempt pass achievement on VQE and VCE in a national cohort of surgical trainees.
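The modeling above used cross-classified random-effects regression. As a deliberately simplified illustration of the idea, the sketch below keeps only a random intercept for training program and regresses an in-training examination score on MK and PC ratings; the file and column names are hypothetical, and the full cross-classified structure is omitted.

```python
# Simplified sketch of relating Milestone ratings to in-training exam scores.
# The study used cross-classified random-effects models; here only a random
# intercept for program is retained. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

ratings = pd.read_csv("milestones_vsite.csv")  # hypothetical file, one row per trainee-year

fit = smf.mixedlm(
    "vsite_score ~ mk_rating + pc_rating + C(pgy)",  # MK/PC ratings, PGY as a factor
    data=ratings,
    groups="program_id",                             # random intercept for program
).fit()

print(fit.summary())
```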


Internship and Residency , Humans , United States , Longitudinal Studies , Educational Measurement , Clinical Competence , Education, Medical, Graduate , Accreditation
5.
Med Educ ; 58(1): 93-104, 2024 01.
Article En | MEDLINE | ID: mdl-37455291

BACKGROUND: The conceptualisation of medical competence is central to its use in competency-based medical education. Calls for 'fixed standards' with 'flexible pathways', recommended in recent reports, require competence to be well defined. Making competence explicit and measurable has, however, been difficult, in part due to a tension between the need for standardisation and the acknowledgment that medical professionals must also be valued as unique individuals. To address these conflicting demands, a multilayered conceptualisation of competence is proposed, with implications for the definition of standards and approaches to assessment. THE MODEL: Three layers are elaborated. The first is a core layer of canonical knowledge and skill, 'that, which every professional should possess', independent of the context of practice. The second layer is context-dependent knowledge, skill, and attitude, visible through practice in health care. The third layer of personalised competence includes personal skills, interests, habits and convictions, integrated with one's personality. This layer, discussed with reference to Vygotsky's concept of Perezhivanie, cognitive load theory, self-determination theory and Maslow's 'self-actualisation', may be regarded as the art of medicine. We propose that fully matured professional competence requires all three layers, but that the assessment of each layer is different. IMPLICATIONS: The assessment of canonical knowledge and skills (Layer 1) can be approached with classical psychometric conditions, that is, similar tests, circumstances and criteria for all. Context-dependent medical competence (Layer 2) must be assessed differently, because conditions of assessment across candidates cannot be standardised. Here, multiple sources of information must be merged and intersubjective expert agreement should ground decisions about progression and level of clinical autonomy of trainees. Competence as the art of medicine (Layer 3) cannot be standardised and should not be assessed with the purpose of permission to practice. The pursuit of personal excellence in this level, however, can be recognised and rewarded.


Medicine , Professional Competence , Humans , Attitude , Delivery of Health Care , Psychometrics , Clinical Competence
6.
Anat Sci Educ ; 17(2): 433-443, 2024 Mar.
Article En | MEDLINE | ID: mdl-38108595

Haptic perception is used in the anatomy laboratory with the handling of three-dimensional (3D) prosections, dissections, and synthetic models of anatomical structures. Vision-based spatial ability has been found to correlate with performance on tests of 3D anatomy knowledge in previous studies. The objective was to explore whether haptic-based spatial ability was correlated with vision-based spatial ability. Vision-based spatial ability was measured in a study group of 49 medical graduates with three separate tests: redrawn Vandenberg and Kuse Mental Rotations Tests in two (MRT A) and three (MRT C) dimensions and a Surface Development Test (SDT). Haptic-based spatial ability was measured using 18 different objects constructed from 10 cubes glued together. Participants were asked to draw these objects from blind haptic perception, and drawings were scored by two independent judges. The maximum score was 24 for each of MRT A and MRT C, 60 for SDT, and 18 for the drawings. The drawing score based on haptic perception [median = 17 (lower quartile = 16, upper quartile = 18)] correlated with MRT A [14 (9, 17)], MRT C [9 (7, 12)] and SDT [44 (36, 52)] scores with a Spearman's rank correlation coefficient of 0.395 (p = 0.0049), 0.507 (p = 0.0002) and 0.606 (p < 0.0001), respectively. Spatial abilities assessed by vision-based tests were correlated with a drawing score based on haptic perception of objects. Future research should investigate the contribution of haptic-based and vision-based spatial abilities to learning 3D anatomy from physical models.
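As a worked illustration of the correlation analysis reported in this abstract, the sketch below computes Spearman's rank correlation between a haptic drawing score and one vision-based test score. The arrays are fabricated placeholders, not study data.

```python
# Worked example: Spearman's rank correlation between a haptic-perception drawing
# score and a vision-based test score. The data below are fabricated placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
drawing = rng.integers(10, 19, size=49)        # drawing score (max 18), n = 49 graduates
mrt_a = drawing + rng.normal(0, 3, size=49)    # MRT A-like score, loosely related to drawing

rho, p = spearmanr(drawing, mrt_a)
print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")
```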


Anatomy , Education, Medical, Undergraduate , Spatial Navigation , Humans , Stereognosis , Anatomy/education , Learning , Education, Medical, Undergraduate/methods , Space Perception
7.
Perspect Med Educ ; 12(1): 507-516, 2023.
Article En | MEDLINE | ID: mdl-37954041

The widespread adoption of Competency-Based Medical Education (CBME) has resulted in a more explicit focus on learners' abilities to effectively demonstrate achievement of the competencies required for safe and unsupervised practice. While CBME implementation has yielded many benefits, by focusing explicitly on what learners are doing, curricula may be unintentionally overlooking who learners are becoming (i.e., the formation of their professional identities). Integrating professional identity formation (PIF) into curricula has the potential to positively influence professionalism, well-being, and inclusivity; however, issues related to the definition, assessment, and operationalization of PIF have made it difficult to embed this curricular imperative into CBME. This paper aims to outline a path towards the reconciliation of PIF and CBME to better support the development of physicians that are best suited to meet the needs of society. To begin to reconcile CBME and PIF, this paper defines three contradictions that must and can be resolved, namely: (1) CBME attends to behavioral outcomes whereas PIF attends to developmental processes; (2) CBME emphasizes standardization whereas PIF emphasizes individualization; (3) CBME organizes assessment around observed competence whereas the assessment of PIF is inherently more holistic. Subsequently, the authors identify curricular opportunities to address these contradictions, such as incorporating process-based outcomes into curricula, recognizing the individualized and contextualized nature of competence, and incorporating guided self-assessment into coaching and mentorship programs. In addition, the authors highlight future research directions related to each contradiction with the goal of reconciling 'doing' and 'being' in medical education.


Education, Medical , Social Identification , Humans , Competency-Based Education/methods , Curriculum , Professionalism
8.
JAMA Netw Open ; 6(4): e237588, 2023 04 03.
Article En | MEDLINE | ID: mdl-37040112

Importance: Evaluation of trainees in graduate medical education training programs using Milestones has been in place since 2013. It is not known whether trainees who have lower ratings during the last year of training go on to have concerns related to interactions with patients in posttraining practice. Objective: To investigate the association between resident Milestone ratings and posttraining patient complaints. Design, Setting, and Participants: This retrospective cohort study included physicians who completed Accreditation Council for Graduate Medical Education (ACGME)-accredited programs between July 1, 2015, and June 30, 2019, and worked at a site that participated in the national Patient Advocacy Reporting System (PARS) program for at least 1 year. Milestone ratings from ACGME training programs and patient complaint data from PARS were collected. Data analysis was conducted from March 2022 to February 2023. Exposures: Lowest professionalism (P) and interpersonal and communication skills (ICS) Milestones ratings 6 months prior to the end of training. Main Outcomes and Measures: PARS year 1 index scores, based on recency and severity of complaints. Results: The cohort included 9340 physicians with a median (IQR) age of 33 (31-35) years; 4516 (48.4%) were women physicians. Overall, 7001 (75.0%) had a PARS year 1 index score of 0, 2023 (21.7%) had a score of 1 to 20 (moderate), and 316 (3.4%) had a score of 21 or greater (high). Among physicians in the lowest Milestones group, 34 of 716 (4.7%) had high PARS year 1 index scores, while 105 of 3617 (2.9%) with Milestone ratings of 4.0 (proficient) had high PARS year 1 index scores. In a multivariable ordinal regression model, physicians in the 2 lowest Milestones rating groups (0-2.5 and 3.0-3.5) were statistically significantly more likely to have higher PARS year 1 index scores than the reference group with Milestones ratings of 4.0 (0-2.5 group: odds ratio, 1.2 [95% CI, 1.0-1.5]; 3.0-3.5 group: odds ratio, 1.2 [95% CI, 1.1-1.3]). Conclusions and Relevance: In this study, trainees with low Milestone ratings in P and ICS near the end of residency were at increased risk for patient complaints in their early posttraining independent physician practice. Trainees with lower Milestone ratings in P and ICS may need more support during graduate medical education training or in the early part of their posttraining practice career.
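A minimal sketch of the kind of ordinal (proportional-odds) model described above: the three-level PARS year 1 index category regressed on Milestone rating group, with the 4.0 "proficient" group as the reference. The file name, column names, and group labels are hypothetical assumptions, not the authors' code.

```python
# Sketch of an ordered-logit model: PARS year 1 index category (0 / moderate / high)
# regressed on Milestone rating group, reference = the 4.0 group.
# File, column names, and group labels are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("pars_milestones.csv")  # hypothetical physician-level file

# Ordered outcome: 0 = no complaints, 1 = moderate (score 1-20), 2 = high (>= 21)
df["pars_cat"] = pd.Categorical(df["pars_cat"], categories=[0, 1, 2], ordered=True)

# Dummy-code the Milestone group, leaving "4.0" as the omitted reference level.
exog = pd.get_dummies(df["milestone_group"])[["0-2.5", "3.0-3.5"]].astype(float)

fit = OrderedModel(df["pars_cat"], exog, distr="logit").fit(method="bfgs")
print(np.exp(fit.params[["0-2.5", "3.0-3.5"]]))  # odds ratios vs the 4.0 group
```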


Internship and Residency , Physicians , Humans , Female , Adult , Male , Retrospective Studies , Clinical Competence , Education, Medical, Graduate
9.
J Contin Educ Health Prof ; 43(1): 52-59, 2023 01 01.
Article En | MEDLINE | ID: mdl-36849429

ABSTRACT: The information systems designed to support clinical care have evolved separately from those that support health professions education. This has resulted in a considerable digital divide between patient care and education, one that poorly serves practitioners and organizations, even as learning becomes ever more important to both. In this perspective, we advocate for the enhancement of existing health information systems so that they intentionally facilitate learning. We describe three well-regarded frameworks for learning that can point toward how health care information systems can best evolve to support learning. The Master Adaptive Learner model suggests ways that the individual practitioner can best organize their activities to ensure continual self-improvement. The PDSA (Plan-Do-Study-Act) cycle similarly proposes actions for improvement but at a health care organization's workflow level. Senge's Five Disciplines of the Learning Organization, a more general framework from the business literature, serves to further inform how disparate information and knowledge flows can be managed for continual improvement. Our main thesis holds that these types of learning frameworks should inform the design and integration of information systems serving the health professions. An underutilized mediator of educational improvement is the ubiquitous electronic health record. The authors list learning analytic opportunities, including potential modifications of learning management systems and the electronic health record, that would enhance health professions education and support the shared goal of delivering high-quality evidence-based health care.


Electronic Health Records , Learning , Humans , Health Occupations , Knowledge
10.
Acad Med ; 98(7): 813-820, 2023 07 01.
Article En | MEDLINE | ID: mdl-36724304

PURPOSE: Accurate assessment of clinical performance is essential to ensure graduating residents are competent for unsupervised practice. The Accreditation Council for Graduate Medical Education milestones framework is the most widely used competency-based framework in the United States. However, the relationship between residents' milestones competency ratings and their subsequent early career clinical outcomes has not been established. It is important to examine the association between milestones competency ratings of U.S. general surgical residents and those surgeons' patient outcomes in early career practice. METHOD: A retrospective, cross-sectional study was conducted using a sample of national Medicare claims for 23 common, high-risk inpatient general surgical procedures performed between July 1, 2015, and November 30, 2018 (n = 12,400 cases) by nonfellowship-trained U.S. general surgeons. Milestone ratings collected during those surgeons' last year of residency (n = 701 residents) were compared with their risk-adjusted rates of mortality, any complication, or severe complication within 30 days of index operation during their first 2 years of practice. RESULTS: There were no associations between mean milestone competency ratings of graduating general surgery residents and their subsequent early career patient outcomes, including any complication (23% proficient vs 22% not yet proficient; relative risk [RR], 0.97 [95% CI, 0.88-1.08]); severe complication (9% vs 9%, respectively; RR, 1.01 [95% CI, 0.86-1.19]); and mortality (5% vs 5%; RR, 1.07 [95% CI, 0.88-1.30]). Secondary analyses yielded no associations between patient outcomes and milestone ratings specific to technical performance, or between patient outcomes and composites of operative performance, professionalism, or leadership milestones ratings (P ranged .32-.97). CONCLUSIONS: Milestone ratings of graduating general surgery residents were not associated with the patient outcomes of those surgeons when they performed common, higher-risk procedures in a Medicare population. Efforts to improve how milestones ratings are generated might strengthen their association with early career outcomes.
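The effect measure reported above is a relative risk with a 95% CI. As a worked example of that calculation (with a Wald interval on the log scale), the counts below are fabricated for illustration only, not the study's data.

```python
# Worked example: relative risk of group A vs group B with a Wald 95% CI
# computed on the log scale. Counts are fabricated placeholders.
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Return (RR, (lower, upper)) for group A vs group B."""
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    # Standard error of log(RR): sqrt(1/a - 1/n_a + 1/b - 1/n_b)
    se_log_rr = math.sqrt((1 - risk_a) / events_a + (1 - risk_b) / events_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, (lo, hi)

print(relative_risk(events_a=230, n_a=1000, events_b=220, n_b=1000))
```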


Internship and Residency , Aged , Humans , United States , Retrospective Studies , Cross-Sectional Studies , Clinical Competence , Medicare , Education, Medical, Graduate/methods , Accreditation , Educational Measurement/methods
11.
Ann Surg ; 277(4): e971-e977, 2023 04 01.
Article En | MEDLINE | ID: mdl-35129524

OBJECTIVE: This study aims to investigate at-risk scores of semiannual Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings for vascular surgical trainees' final achievement of competency targets. SUMMARY BACKGROUND DATA: ACGME Milestones assessments have been collected since 2015 for Vascular Surgery. It is unclear whether milestone ratings throughout training predict achievement of recommended performance targets upon graduation. METHODS: National ACGME Milestones data were utilized for analyses. All trainees completing 2-year vascular surgery fellowships in June 2018 and 5-year integrated vascular surgery residencies in June 2019 were included. A generalized estimating equations model was used to obtain at-risk scores for each of the 31 subcompetencies by semiannual review periods, to estimate the probability of trainees achieving the recommended graduation target based on their previous ratings. RESULTS: A total of 122 vascular surgery fellows (VSFs) (95.3%) and 52 integrated vascular surgery residents (IVSRs) (100%) were included. The proportion of trainees who did not achieve level 4.0 competency targets ranged from 1.6% to 25.4% across subcompetencies and did not differ significantly between VSFs and IVSRs for any subcompetency (P = 0.161-0.999). Trainees were found to be at greater risk of not achieving competency targets when lower milestone ratings were assigned, and at later time-points in training. At a milestone rating of 2.5, with 1 year remaining before graduation, the at-risk score for not achieving the target level 4.0 milestone ranged from 2.9% to 77.9% for VSFs and 33.3% to 75.0% for IVSRs. CONCLUSION: The ACGME Milestones provide early diagnostic and predictive information for vascular surgery trainees' achievement of competence at completion of training.
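A rough sketch of the at-risk estimation described above, reduced to a single subcompetency and a single review period: a GEE logistic model of "did not reach Level 4.0 at graduation" on the earlier Milestone rating, clustered by program. The file, column names, and single-period framing are simplifying assumptions, not the authors' implementation.

```python
# Sketch only: GEE logistic model estimating the probability of not reaching the
# Level 4.0 graduation target given a rating assigned one year before graduation.
# File and column names are hypothetical; analysis is reduced to one subcompetency.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("milestone_reviews.csv")  # trainee-by-review-period file (hypothetical)
one_period = df[df["review_period"] == "1_year_out"]

fit = smf.gee(
    "missed_target ~ prior_rating",          # missed_target: 1 if < 4.0 at graduation
    groups="program_id",                     # cluster on training program
    data=one_period,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

# At-risk score for a trainee rated 2.5 with one year remaining.
print(fit.predict(pd.DataFrame({"prior_rating": [2.5]})))
```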


Internship and Residency , Humans , Educational Measurement , Clinical Competence , Education, Medical, Graduate , Accreditation , Vascular Surgical Procedures
12.
J Surg Educ ; 80(2): 235-246, 2023 02.
Article En | MEDLINE | ID: mdl-36182635

OBJECTIVE: Program directors in surgical disciplines need more tools from the ACGME to help them use Milestone ratings to improve trainees' performance. This is especially true in competencies that are notoriously difficult to measure, such as professionalism (PROF) and interpersonal and communication skills (ICS). It is now widely understood that skills in these two areas have a direct impact on patient care outcomes. This study investigated the potential for generating early predictors of final Milestone ratings within the PROF and ICS competency categories. DESIGN: This retrospective cohort study utilized Milestone ratings from all ACGME-accredited vascular surgery training programs, covering residents and fellows who completed training in June 2019. The outcome measure studied was the rate of achieving the recommended graduation target of Milestone Level 4 (possible range: 1-5), while the predictors were the Milestone ratings attained at earlier stages of training. Predictive probability values (PPVs) were calculated for each of the 3 PROF and 2 ICS sub-competencies to estimate the probability of trainees not reaching the recommended graduation target based on their previous Milestone ratings. SETTING: All ACGME-accredited vascular surgery training programs within the United States. PARTICIPANTS: All trainees completing a 2-year vascular surgery fellowship (VSF) in June 2019 (n = 119) or a 5-year integrated vascular surgery residency (IVSR) in June 2019 (n = 52) were included in the analyses. RESULTS: The overall rate of failing to achieve the recommended graduation target across all PROF and ICS sub-competencies ranged from 7.7% to 21.8% of all trainees. For trainees with a Milestone rating at ≤ 2.5 with 1 year remaining in their training program, the predictive probability of not achieving the recommended graduation target ranged from 37.0% to 71.5% across sub-competencies, with the highest risks observed under PROF for "Administrative Tasks" (71.5%) and under ICS for "Communication with the Healthcare Team" (56.7%). CONCLUSIONS: As many as 1 in 4 vascular surgery trainees did not achieve the ACGME vascular surgery Milestones targets for graduation in at least one of the PROF and ICS sub-competencies. Biannual ACGME Milestone assessment ratings of PROF and ICS during early training can be used to predict achievement of competency targets at time of graduation. Early clues to problems in PROF and ICS enable programs to address potential deficits early in training to ensure competency in these essential non-technical skills prior to entering unsupervised practice.


Internship and Residency , Humans , United States , Educational Measurement , Professionalism , Retrospective Studies , Education, Medical, Graduate , Clinical Competence , Communication , Vascular Surgical Procedures
13.
Med Teach ; 44(8): 886-892, 2022 08.
Article En | MEDLINE | ID: mdl-36083123

PURPOSE: Organizational readiness is critical for successful implementation of an innovation. We evaluated program readiness to implement Competence by Design (CBD), a model of Competency-Based Medical Education (CBME), among Canadian postgraduate training programs. METHODS: A survey of program directors was distributed 1 month prior to CBD implementation in 2019. Questions were informed by the R = MC2 framework of organizational readiness and addressed: program motivation, general capacity for change, and innovation-specific capacity. An overall readiness score was calculated. An ANOVA was conducted to compare overall readiness between disciplines. RESULTS: Survey response rate was 42% (n = 79). The mean overall readiness score was 74% (30-98%). There was no difference in scores between disciplines. The majority of respondents agreed that successful implementation of CBD was a priority (74%), and that their leadership (94%) and faculty and residents (87%) were supportive of change. Fewer perceived that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). Curriculum mapping, competence committees and programmatic assessment activities were completed by >90% of programs, while <50% had engaged off-service disciplines. CONCLUSION: Our study highlights important areas where programs excelled in their preparation for CBD, as well as common challenges that serve as targets for future intervention to improve program readiness for CBD implementation.
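A minimal sketch of the between-discipline comparison described above: a one-way ANOVA testing whether mean overall readiness scores differ across disciplines. The discipline groupings and scores are fabricated placeholders.

```python
# Worked example: one-way ANOVA comparing overall readiness scores across disciplines.
# Group labels and scores are fabricated placeholders, not survey data.
from scipy.stats import f_oneway

surgery = [72, 80, 65, 77, 90]
medicine = [70, 74, 68, 85, 79]
pediatrics = [66, 81, 75, 73, 88]

f_stat, p_value = f_oneway(surgery, medicine, pediatrics)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # a large p suggests no difference by discipline
```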


Competency-Based Education , Education, Medical , Canada , Curriculum , Humans , Leadership
14.
J Vasc Surg ; 76(5): 1388-1397, 2022 11.
Article En | MEDLINE | ID: mdl-35798280

BACKGROUND: The quality and effectiveness of vascular surgery education should be evaluated based on patient care outcomes. To investigate predictive associations between trainee performance and subsequent patient outcomes, a critical first step is to determine the conceptual alignment of educational competencies with clinical outcomes in practice. We sought to generate expert consensus on the conceptual alignment of the Accreditation Council for Graduate Medical Education (ACGME) Vascular Surgery subcompetencies with patient care outcomes across different Vascular Quality Initiative (VQI) registries. METHODS: A national panel of vascular surgeons with expertise in both clinical care and education were recruited to participate in a modified Delphi expert consensus building process to map ACGME Vascular Surgery subcompetencies (educational markers of resident performance) to VQI clinical modules (patient outcomes). A master list of items for rating was created, including the 31 ACGME Vascular Surgery subcompetencies and 8 VQI clinical registries (endovascular abdominal aortic aneurysm repair, open abdominal aortic aneurysm, thoracic endovascular aortic repair, carotid endarterectomy, carotid artery stent, infrainguinal, suprainguinal, and peripheral vascular intervention). These items were entered into an iterative Delphi process. Positive consensus was reached when 75% or more of the participants ranked an item as mandatory. Intraclass correlations (ICCs) were used to evaluate consistency between experts for each Delphi round. RESULTS: A total of 13 experts who contributed to the development of the Vascular Surgery Milestones participated; 12 experts (92%) participated in both rounds of the Delphi process. Two rounds of Delphi were conducted, as suggested by excellent expert agreement (round 1, ICC = 0.79 [95% confidence interval, 0.74-0.84]; round 2, ICC = 0.97 [95% confidence interval, 0.96-0.98]). Using the predetermined consensus cutoff threshold, the Delphi process reduced the number of subcompetencies mapped to patient care outcomes from 31 to a range of 9 to 15 across the 8 VQI clinical registries. Practice-based learning and improvement and professionalism subcompetencies were identified as less relevant to patient outcome variables captured by the VQI registries after the final round, and the only systems-based practice subcompetency identified as relevant was radiation safety, in two of the endovascular registries. CONCLUSIONS: A national panel of vascular surgeon experts reported a high degree of agreement on the relevance of ACGME subcompetencies to patient care outcomes as captured in the VQI clinical registry. Systems-based practice, practice-based learning and improvement, and professionalism competencies were identified as less relevant to patient outcomes after specific surgical procedures.
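A sketch of the two consensus checks described above: the 75% "mandatory" threshold per item, and an intraclass correlation across experts for a Delphi round (here computed with the pingouin package as one option). The data layout, file, and column names are hypothetical assumptions.

```python
# Sketch only: (1) positive-consensus check at the 75% "mandatory" threshold and
# (2) an intraclass correlation across experts for one Delphi round.
# File and column names are hypothetical; pingouin is one possible ICC tool.
import pandas as pd
import pingouin as pg

ratings = pd.read_csv("delphi_round.csv")  # columns: expert, item, mandatory (0/1), score

# (1) Positive consensus: >= 75% of experts rated the item as mandatory.
consensus = ratings.groupby("item")["mandatory"].mean() >= 0.75
print(consensus[consensus].index.tolist())

# (2) Inter-expert consistency for the round (ICC across items and raters).
icc = pg.intraclass_corr(data=ratings, targets="item", raters="expert", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
```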


Aortic Aneurysm, Abdominal , Humans , Aortic Aneurysm, Abdominal/surgery , Consensus , Clinical Competence , Education, Medical, Graduate , Vascular Surgical Procedures/education , Accreditation
15.
Acad Med ; 97(4): 569-576, 2022 04 01.
Article En | MEDLINE | ID: mdl-34192718

PURPOSE: To investigate whether milestone data obtained from clinical competency committee (CCC) ratings in a single specialty reflected the 6 general competency domains framework. METHOD: The authors examined milestone ratings from all 275 U.S. Accreditation Council for Graduate Medical Education-accredited categorical obstetrics and gynecology (OBGYN) programs from July 1, 2018, to June 30, 2019. The sample size ranged from 1,371 to 1,438 residents from 275 programs across 4 postgraduate years (PGYs), each with 2 assessment periods. The OBGYN milestones reporting form consisted of 28 subcompetencies under the 6 general competency domains. Milestone ratings were determined by each program's CCC. Intraclass correlations (ICCs) and design effects were calculated for each subcompetency by PGY and assessment period. A multilevel confirmatory factor analysis (CFA) perspective was used, and the pooled within-program covariance matrix was obtained to compare the fit of the 6-domain factor model against 3 other plausible models. RESULTS: Milestone ratings from 5,618 OBGYN residents were examined. Moderate to high ICCs and design effects greater than 2.0 were prevalent among all subcompetencies for both assessment periods, warranting the use of the multilevel approach in applying CFA to the milestone data. The theory-aided split-patient care (PC) factor model, which used the 6 general competency domains but also included 3 factors within the PC domain (obstetric technical skills, gynecology technical skills, and ambulatory care), was consistently shown as the best-fitting model across all PGY-by-assessment period conditions, except for one. CONCLUSIONS: The findings indicate that in addition to using the 6 general competency domains framework in their rating process, CCCs may have further distinguished the PC competency domain into 3 meaningful factors. This study provides internal structure validity evidence for the milestones within a single specialty and may shed light on CCCs' understanding of the distinctive content embedded within the milestones.
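The clustering diagnostics above (ICCs and design effects) can be sketched for one subcompetency as follows: the ICC comes from a random-intercept model of residents within programs, and the design effect is 1 + (mean cluster size − 1) × ICC. File and column names are hypothetical placeholders, not the authors' data.

```python
# Sketch only: ICC for one subcompetency from a random-intercept model
# (residents nested in programs) and the corresponding design effect,
# 1 + (mean cluster size - 1) * ICC. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("obgyn_milestones.csv")  # columns: program_id, resident_id, pc1_rating

m = smf.mixedlm("pc1_rating ~ 1", data=df, groups="program_id").fit()
var_between = m.cov_re.iloc[0, 0]   # program-level (between-cluster) variance
var_within = m.scale                # residual (within-cluster) variance
icc = var_between / (var_between + var_within)

mean_cluster = df.groupby("program_id").size().mean()
design_effect = 1 + (mean_cluster - 1) * icc
print(f"ICC = {icc:.2f}, design effect = {design_effect:.2f}")  # > 2.0 favors a multilevel CFA
```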


Clinical Competence , Internship and Residency , Accreditation , Education, Medical, Graduate , Educational Measurement , Humans
16.
JAMA Netw Open ; 4(12): e2137179, 2021 12 01.
Article En | MEDLINE | ID: mdl-34874406

Importance: Longitudinal Milestones data reported to the Accreditation Council for Graduate Medical Education (ACGME) can be used to measure the developmental and educational progression of learners. Learning trajectories illustrate the pattern and rate at which learners acquire competencies toward unsupervised practice. Objective: To investigate the reliability of learning trajectories and patterns of learning progression that can support meaningful intervention and remediation for residents. Design, Setting, and Participants: This national retrospective cohort study included Milestones data from residents in family medicine, representing 6 semi-annual reporting periods from July 2016 to June 2019. Interventions: Longitudinal formative assessment using the Milestones assessment system reported to the ACGME. Main Outcomes and Measures: To estimate longitudinal consistency, growth rate reliability (GRR) and growth curve reliability (GCR) for 22 subcompetencies in the ACGME family medicine Milestones were used, incorporating clustering effects at the program level. Latent class growth curve models were used to examine longitudinal learning trajectories. Results: This study included Milestones ratings from 3872 residents in 514 programs. The Milestones reporting system reliably differentiated individual longitudinal patterns for formative purposes (mean [SD] GRR, 0.63 [0.03]); there was also evidence of precision for model-based rates of change (mean [SD] GCR, 0.91 [0.02]). Milestones ratings increased significantly across training years and reporting periods (mean [SD] of 0.55 [0.04] Milestones units per reporting period; P < .001); patterns of developmental progress varied by subcompetency. There were 3 or 4 distinct patterns of learning trajectories for each of the 22 subcompetencies. For example, for the professionalism subcompetency, residents were classified to 4 groups of learning trajectories; during the 3-year family medicine training period, trajectories diverged further after postgraduate year (PGY) 1, indicating a potential remediation point between the end of PGY 1 and the beginning of PGY 2 for struggling learners, who represented 16% of learners (620 residents). Similar inferences for learning trajectories were found for practice-based learning and improvement, systems-based practice, and interpersonal and communication skills. Subcompetencies in medical knowledge and patient care demonstrated more consistent patterns of upward growth. Conclusions and Relevance: These findings suggest that the Milestones reporting system provides reliable longitudinal data for individualized tracking of progress in all subcompetencies. Learning trajectories with supporting reliability evidence could be used to understand residents' developmental progress and tailored for individualized learning plans and remediation.


Clinical Competence/standards , Competency-Based Education/standards , Family Practice/education , Internship and Residency/standards , Education, Medical, Graduate/standards , Humans , Retrospective Studies
17.
J Grad Med Educ ; 13(3): 404-410, 2021 Jun.
Article En | MEDLINE | ID: mdl-34178266

BACKGROUND: The American Medical Association Accelerating Change in Medical Education (AMA-ACE) consortium proposes that medical schools include a new 3-pillar model incorporating health systems science (HSS) and basic and clinical sciences. One of the goals of AMA-ACE was to support HSS curricular innovation to improve residency preparation. OBJECTIVE: This study evaluates the effectiveness of HSS curricula by using a large dataset to link medical school graduates to internship Milestones through collaboration with the Accreditation Council for Graduate Medical Education (ACGME). METHODS: ACGME subcompetencies related to the schools' HSS curricula were identified for internal medicine, emergency medicine, family medicine, obstetrics and gynecology (OB/GYN), pediatrics, and surgery. Analysis compared Milestone ratings of ACE school graduates to non-ACE graduates at 6 and 12 months using generalized estimating equation models. RESULTS: At 6 months both groups demonstrated similar HSS-related levels of Milestone performance on the selected ACGME competencies. At 1 year, ACE graduates in OB/GYN scored minimally higher on 2 systems-based practice (SBP) subcompetencies compared to non-ACE school graduates: SBP01 (1.96 vs 1.82, 95% CI 0.03-0.24) and SBP02 (1.87 vs 1.79, 95% CI 0.01-0.16). In internal medicine, ACE graduates scored minimally higher on 3 HSS-related subcompetencies: SBP01 (2.19 vs 2.05, 95% CI 0.04-0.26), PBLI01 (2.13 vs 2.01; 95% CI 0.01-0.24), and PBLI04 (2.05 vs 1.93; 95% CI 0.03-0.21). For the other specialties examined, there were no significant differences between groups. CONCLUSIONS: Graduates from schools with training in HSS had similar Milestone ratings for most subcompetencies and very small differences in Milestone ratings for only 5 subcompetencies across 6 specialties at 1 year, compared to graduates from non-ACE schools. These differences are likely not educationally meaningful.


Internship and Residency , Accreditation , Child , Clinical Competence , Education, Medical, Graduate , Educational Measurement , Humans , United States
18.
Acad Med ; 96(9): 1324-1331, 2021 09 01.
Article En | MEDLINE | ID: mdl-34133345

PURPOSE: The United States Medical Licensing Examination (USMLE) sequence and the Accreditation Council for Graduate Medical Education (ACGME) milestones represent 2 major components along the continuum of assessment from undergraduate through graduate medical education. This study examines associations between USMLE Step 1 and Step 2 Clinical Knowledge (CK) scores and ACGME emergency medicine (EM) milestone ratings. METHOD: In February 2019, subject matter experts (SMEs) provided judgments of expected associations for each combination of Step examination and EM subcompetency. The resulting sets of subcompetencies with expected strong and weak associations were selected for convergent and discriminant validity analysis, respectively. National-level data for 2013-2018 were provided; the final sample included 6,618 EM residents from 158 training programs. Empirical bivariate correlations between milestone ratings and Step scores were calculated, then those correlations were compared with the SMEs' judgments. Multilevel regression analyses were conducted on the selected subcompetencies, in which milestone ratings were the dependent variable, and Step 1 score, Step 2 CK score, and cohort year were independent variables. RESULTS: Regression results showed small but statistically significant positive relationships between Step 2 CK score and the subcompetencies (regression coefficients ranged from 0.02 [95% confidence interval (CI), 0.01-0.03] to 0.12 [95% CI, 0.11-0.13]; all P < .05), with the degree of association matching the SMEs' judgments for 7 of the 9 selected subcompetencies. For example, a 1 standard deviation increase in Step 2 CK score predicted a 0.12 increase in MK-01 milestone rating, when controlling for Step 1. Step 1 score showed a small statistically significant effect with only the MK-01 subcompetency (regression coefficient = 0.06 [95% CI, 0.05-0.07], P < .05). CONCLUSIONS: These results provide incremental validity evidence in support of Step 1 and Step 2 CK score and EM milestone rating uses.
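A simplified sketch of the regression described above: an EM subcompetency rating modeled on standardized Step 1 and Step 2 CK scores plus cohort year, with a random intercept for training program standing in for the multilevel structure. File and column names are hypothetical placeholders.

```python
# Sketch only: milestone rating ~ standardized Step 1 + standardized Step 2 CK + cohort year,
# with a program-level random intercept. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("em_milestones_usmle.csv")

# Standardize Step scores so coefficients read as "milestone change per 1 SD of score".
for col in ["step1", "step2ck"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

fit = smf.mixedlm(
    "mk01_rating ~ step1_z + step2ck_z + C(cohort_year)",
    data=df,
    groups="program_id",
).fit()
print(fit.summary())  # e.g., the study reports ~0.12 per SD of Step 2 CK for MK-01
```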


Clinical Competence/statistics & numerical data , Education, Medical, Graduate/statistics & numerical data , Educational Measurement/statistics & numerical data , Emergency Medicine/statistics & numerical data , Internship and Residency/statistics & numerical data , Accreditation , Adult , Educational Measurement/methods , Emergency Medicine/education , Female , Humans , Licensure, Medical , Male , Middle Aged , Multilevel Analysis , Regression Analysis , Reproducibility of Results , United States , Young Adult
19.
Med Teach ; 43(7): 780-787, 2021 Jul.
Article En | MEDLINE | ID: mdl-34020576

Health care revolves around trust. Patients are often in a position that gives them no other choice than to trust the people taking care of them. Educational programs thus have the responsibility to develop physicians who can be trusted to deliver safe and effective care, ultimately making a final decision to entrust trainees to graduate to unsupervised practice. Such entrustment decisions deserve to be scrutinized for their validity. This end-of-training entrustment decision is arguably the most important one, although earlier entrustment decisions, for smaller units of professional practice, should also be scrutinized for their validity. Validity of entrustment decisions implies a defensible argument that can be analyzed in components that together support the decision. According to Kane, building a validity argument is a process designed to support inferences of scoring, generalization across observations, extrapolation to new instances, and implications of the decision. A lack of validity can be caused by inadequate evidence in terms of, according to Messick, content, response process, internal structure (coherence) and relationship to other variables, and in misinterpreted consequences. These two leading frameworks (Kane and Messick) in educational and psychological testing can be well applied to summative entrustment decision-making. The authors elaborate the types of questions that need to be answered to arrive at defensible, well-argued summative decisions regarding performance to provide a grounding for high-quality safe patient care.


Internship and Residency , Physicians , Clinical Competence , Competency-Based Education , Decision Making , Humans , Trust
20.
J Grad Med Educ ; 13(2 Suppl): 14-44, 2021 Apr.
Article En | MEDLINE | ID: mdl-33936531

BACKGROUND: Since 2013, US residency programs have used the competency-based framework of the Milestones to report resident progress and to provide feedback to residents. The implementation of Milestones-based assessments, clinical competency committee (CCC) meetings, and processes for providing feedback varies among programs and warrants systematic examination across specialties. OBJECTIVE: We sought to determine how varying assessment, CCC, and feedback implementation strategies result in different outcomes in resource expenditure and stakeholder engagement, and to explore the contextual forces that moderate these outcomes. METHODS: From 2017 to 2018, interviews were conducted of program directors, CCC chairs, and residents in emergency medicine (EM), internal medicine (IM), pediatrics, and family medicine (FM), querying their experiences with Milestone processes in their respective programs. Interview transcripts were coded using template analysis, with the initial template derived from previous research. The research team conducted iterative consensus meetings to ensure that the evolving template accurately represented phenomena described by interviewees. RESULTS: Forty-four individuals were interviewed across 16 programs (5 EM, 4 IM, 5 pediatrics, 3 FM). We identified 3 stages of Milestone-process implementation, including a resource-intensive early stage, an increasingly efficient transition stage, and a final stage for fine-tuning. CONCLUSIONS: Residency program leaders can use these findings to place their programs along an implementation continuum and gain an understanding of the strategies that have enabled their peers to progress to improved efficiency and increased resident and faculty engagement.


Internship and Residency , Population Health , Child , Clinical Competence , Competency-Based Education , Educational Measurement , Humans , Internal Medicine/education
...